The Second-order Bias and MSE of Quantile Estimators
Authors
Abstract
Finite-sample theory based on higher-order asymptotics provides better approximations of the bias and mean squared error (MSE) for a class of estimators. However, no finite-sample theory result is available for quantile regression, and the literature on quantile regression has relied entirely on first-order asymptotic theory. This paper develops new analytical results on the second-order bias and the MSE up to order O(N^{-2}) of conditional quantile regression estimators, extending the conditional mean regression results in Rilstone, Srivastava and Ullah (1996). First, we provide general results on the second-order bias and MSE of conditional quantile estimators. The second-order bias result enables an improved bias correction and thus improved quantile estimation. In particular, we show that the second-order bias is much larger towards the tails of the conditional density than near the median, so the benefit of the second-order bias correction is greater when the deeper tail quantiles are of interest, e.g., in the study of income distributions and in financial risk management. The higher-order MSE result for quantile estimation also enables us to better understand the sources of estimation uncertainty. Next, we consider three special cases of the general results: unconditional quantile estimation, conditional quantile regression with a binary covariate, and instrumental variable quantile regression (IVQR). For each of these special cases, we provide the second-order bias and MSE to illustrate how their behavior depends on certain parameters and distributional characteristics. Monte Carlo simulations indicate that the bias is larger at the extreme low and high tail quantiles, and that the second-order bias-corrected estimator behaves better than the uncorrected one in both conditional and unconditional quantile regression. The second-order bias-corrected estimators are numerically much closer to the true parameters of the data-generating processes. Since the higher-order bias and MSE decrease as the sample size increases or as the regression error variance decreases, the benefits of the finite-sample theory are most apparent when sampling errors in estimation are large. In an empirical application, we study the impact of schooling, experience, and tenure on earnings quantiles. We find larger bias at the extreme low and high earnings quantiles, and the second-order bias correction also improves quantile prediction, especially in the tails.
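To make the tail-bias claim concrete, the following is a minimal Monte Carlo sketch, not taken from the paper, that estimates the finite-sample bias of the unconditional sample quantile at several quantile levels for standard-normal data. The sample size, replication count, and distribution are illustrative assumptions.

```python
# Minimal Monte Carlo sketch (illustrative only, not the paper's code):
# estimate the finite-sample bias of the unconditional sample quantile
# at several quantile levels tau, using standard-normal data.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
N, reps = 50, 20_000                          # assumed sample size and replications
taus = np.array([0.05, 0.10, 0.25, 0.50, 0.75, 0.90, 0.95])
true_q = norm.ppf(taus)                       # population quantiles of N(0, 1)

draws = rng.standard_normal((reps, N))        # reps independent samples of size N
est_q = np.quantile(draws, taus, axis=1).T    # sample quantiles, shape (reps, len(taus))
bias = est_q.mean(axis=0) - true_q            # Monte Carlo estimate of E[q_hat] - q

for t, b in zip(taus, bias):
    print(f"tau = {t:4.2f}   bias ~ {b:+.4f}")
```

For moderate N the simulated bias is close to zero at the median and grows in magnitude toward tau = 0.05 and tau = 0.95, which mirrors the pattern the paper reports in its simulations.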
Similar References
A Note on Covariance Matrix Estimation in Quantile Regressions
This note discusses some issues related to bandwidth selection based on moment expansions of the mean squared error (MSE) of the regression quantile estimator. We use higher order expansions to provide a way to distinguish among asymptotically equivalent nonparametric estimators. We derive approximations to the (standardized) MSE of the covariance matrix estimation. This facilitates a ...
Mean Squared Error of Empirical Predictor
The term “empirical predictor” refers to a two-stage predictor of a linear combination of fixed and random effects. In the first stage, a predictor is obtained but it involves unknown parameters; thus, in the second stage, the unknown parameters are replaced by their estimators. In this paper, we consider mean squared errors (MSE) of empirical predictors under a general setup, where ML or REML ...
A Unified Measure of Uncertainty of Estimated Best Linear Unbiased Predictors in Small Area Estimation Problems
We obtain a second order approximation to the mean squared error (MSE), and its estimate, of the empirical or estimated best linear unbiased predictor (EBLUP) of a mixed effect in a general mixed linear normal model. This covers many important small area models in the literature. Unlike previous research in this area, we provide a unified theory of measuring uncertainty of an EBLUP for a comple...
Variable data driven bandwidth choice in nonparametric quantile regression
The choice of a smoothing parameter or bandwidth is crucial when applying nonparametric regression estimators. In nonparametric mean regression, various methods for bandwidth selection exist, but in nonparametric quantile regression bandwidth choice is still an unsolved problem. In this paper a selection procedure for locally varying bandwidths based on the asymptotic mean squared error (MSE) of ...
Comparison of Small Area Estimation Methods for Estimating Unemployment Rate
Extended Abstract. In recent years, the need for small area estimation has greatly increased for large surveys, particularly household surveys at the Statistical Centre of Iran (SCI), because of costs and respondent burden. The lack of suitable auxiliary variables between two decennial housing and population censuses is a challenge for SCI in using these methods. In general, the...
Publication date: 2017